Grouped Knowledge Distillation for Deep Face Recognition

Authors

Abstract

Compared with feature-based distillation methods, logits distillation can relax the requirement of consistent feature dimensions between teacher and student networks, yet its performance is deemed inferior in face recognition. One major challenge is that the light-weight student network has difficulty fitting the target logits due to its low model capacity, which is attributed to the significant number of identities in face recognition. Therefore, we seek to probe the target logits to extract the primary knowledge related to face identity and discard the rest, making distillation more achievable for the student network. Specifically, there is a tail group with near-zero values in the prediction, containing only minor knowledge for distillation. To provide a clear perspective on its impact, we first partition the logits into two groups, i.e., the Primary Group and the Secondary Group, according to the cumulative probability of the softened prediction. Then, we reorganize the Knowledge Distillation (KD) loss of the grouped logits into three parts, i.e., Primary-KD, Secondary-KD, and Binary-KD. Primary-KD refers to distilling the primary knowledge from the teacher, Secondary-KD aims to refine minor knowledge but increases the difficulty of distillation, and Binary-KD ensures the consistency of knowledge distribution between teacher and student. We experimentally found that (1) Primary-KD and Binary-KD are indispensable for KD, and (2) Secondary-KD is the culprit restricting KD at the bottleneck. Therefore, we propose Grouped Knowledge Distillation (GKD), which retains Primary-KD and Binary-KD but omits Secondary-KD in the ultimate KD loss calculation. Extensive experimental results on popular face recognition benchmarks demonstrate the superiority of the proposed GKD over state-of-the-art methods.
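To make the grouping concrete, the following is a minimal PyTorch-style sketch of how such a grouped KD loss could be organized; it is an illustration under assumptions rather than the authors' implementation. The function name grouped_kd_loss, the cumulative-probability threshold cum_prob, the within-group renormalization, and the temperature scaling are all assumed for the example.

import torch
import torch.nn.functional as F

def grouped_kd_loss(student_logits, teacher_logits, temperature=4.0, cum_prob=0.9):
    # Illustrative sketch (assumed details), not the paper's reference code.
    # Softened teacher and student distributions.
    t_prob = F.softmax(teacher_logits / temperature, dim=1)
    s_prob = F.softmax(student_logits / temperature, dim=1)

    # Primary Group: the smallest set of classes whose cumulative teacher
    # probability reaches cum_prob; the remaining classes form the Secondary Group.
    sorted_prob, order = torch.sort(t_prob, dim=1, descending=True)
    cumulative = torch.cumsum(sorted_prob, dim=1)
    keep_sorted = (cumulative <= cum_prob).float()
    keep_sorted[:, 0] = 1.0  # always keep the top-1 class
    primary = torch.zeros_like(t_prob).scatter(1, order, keep_sorted)  # 1 = Primary, 0 = Secondary

    eps = 1e-8
    # Primary-KD: KL divergence restricted to the Primary Group, renormalized within the group.
    t_primary = (t_prob * primary) / (t_prob * primary).sum(1, keepdim=True)
    s_primary = (s_prob * primary) / ((s_prob * primary).sum(1, keepdim=True) + eps)
    primary_kd = (t_primary * ((t_primary + eps).log() - (s_primary + eps).log())).sum(1)

    # Binary-KD: KL divergence between the two-way (Primary vs. Secondary) probability masses.
    t_bin = torch.stack([(t_prob * primary).sum(1), (t_prob * (1 - primary)).sum(1)], dim=1)
    s_bin = torch.stack([(s_prob * primary).sum(1), (s_prob * (1 - primary)).sum(1)], dim=1)
    binary_kd = (t_bin * ((t_bin + eps).log() - (s_bin + eps).log())).sum(1)

    # GKD keeps Primary-KD and Binary-KD and omits Secondary-KD.
    return (primary_kd + binary_kd).mean() * temperature ** 2

Here student_logits and teacher_logits would be identity-classification logits of shape (batch, num_identities); omitting the secondary term means the near-zero tail of the teacher prediction never enters the final loss.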


Similar Articles

Deep Face Recognition

The goal of this paper is face recognition – from either a single photograph or from a set of faces tracked in a video. Recent progress in this area has been due to two factors: (i) end to end learning for the task using convolutional neural networks (CNNs), and (ii) the availability of very large scale training datasets. We make two contributions: first, we show how a very large scale dataset ...


Data-Free Knowledge Distillation for Deep Neural Networks

Recent advances in model compression have provided procedures for compressing large neural networks to a fraction of their original size while retaining most if not all of their accuracy. However, all of these approaches rely on access to the original training set, which might not always be possible if the network to be compressed was trained on a very large dataset, or on a dataset whose relea...


Face Recognition using PCA, Deep Face Method

The performance process of face recognition involves the inspection study of facial features in an image, recognizing those features and comparing them to one of the many faces in the database. There are many algorithms capable of performing face recognition; such as: Principal Component Analysis, Discrete Cosine Transform, 3D recognition methods, Gabor Wavelets method etc. There were many issu...


Coupled Deep Learning for Heterogeneous Face Recognition

Heterogeneous face matching is a challenge issue in face recognition due to large domain difference as well as insufficient pairwise images in different modalities during training. This paper proposes a coupled deep learning (CDL) approach for the heterogeneous face matching. CDL seeks a shared feature space in which the heterogeneous face matching problem can be approximately treated as a homo...


Deep Attributes for One-Shot Face Recognition

We address the problem of one-shot unconstrained face recognition. This is addressed by using a deep attribute representation of faces. While face recognition has considered the use of attribute based representations, for one-shot face recognition, the methods proposed so far have been using different features that represent the limited example available. We postulate that by using an intermedi...



Journal

Journal: Proceedings of the ... AAAI Conference on Artificial Intelligence

Year: 2023

ISSN: 2159-5399, 2374-3468

DOI: https://doi.org/10.1609/aaai.v37i3.25472